Layer-Wise Relevance Propagation Based Sample Condensation for Kernel Machines
Authors
Abstract
Kernel machines are a powerful class of methods for classification and regression. Making kernel machines fast and scalable to large data, however, is still a challenging problem due to the need of storing and operating on the Gram matrix. In this paper we propose a novel approach to sample condensation for kernel machines, preferably without impairing the classification performance. To the best of our knowledge, there is no previous work with the same goal reported in the literature. For this purpose we make use of the neural network interpretation of kernel machines. Explainable AI techniques, in particular the Layer-wise Relevance Propagation method, are used to measure the relevance (importance) of training samples. Given this measure, a decremental strategy is proposed for condensation. Experimental results on three data sets show that we are able to achieve a substantial reduction of the number...
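The abstract's decremental strategy can be sketched as follows. This is a minimal illustration, not the paper's implementation: the relevance scores are assumed to come from an LRP pass over the neural-network view of the kernel machine (not shown), and `keep_fraction` is a hypothetical condensation budget.

```python
import numpy as np

def condense(X, y, relevance, keep_fraction=0.5):
    """Keep only the most relevant training samples.

    relevance : per-sample importance scores, assumed to be produced
                by an LRP pass (hypothetical upstream step).
    """
    order = np.argsort(relevance)[::-1]           # most relevant first
    n_keep = max(1, int(len(X) * keep_fraction))  # condensation budget
    idx = np.sort(order[:n_keep])                 # restore original order
    return X[idx], y[idx]

# Toy usage with made-up relevance scores.
X = np.arange(10, dtype=float).reshape(5, 2)
y = np.array([0, 1, 0, 1, 0])
r = np.array([0.9, 0.1, 0.5, 0.7, 0.2])
Xc, yc = condense(X, y, r, keep_fraction=0.6)
print(Xc.shape)  # (3, 2)
```

In practice one would retrain the kernel machine on the condensed set and check that accuracy is preserved before discarding further samples.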
Similar Resources
Layer-wise Relevance Propagation for Deep Neural Network Architectures
We present the application of layer-wise relevance propagation to several deep neural networks such as the BVLC reference neural net and googlenet trained on ImageNet and MIT Places datasets. Layerwise relevance propagation is a method to compute scores for image pixels and image regions denoting the impact of the particular image region on the prediction of the classifier for one particular te...
On Pixel-Wise Explanations for Non-Linear Classifier Decisions by Layer-Wise Relevance Propagation
Understanding and interpreting classification decisions of automated image classification systems is of high value in many applications, as it allows verifying the reasoning of the system and provides additional information to the human expert. Although machine learning methods solve a plethora of tasks very successfully, they have in most cases the disadvantage of acting as a black box, ...
The Oil Layer Recognition Based on Multi-kernel Function Relevance Vector Machines
In oil layer recognition, relevance vector machines (RVM) perform well. However, the single-kernel-function RVM has some limitations, so a multi-kernel-function RVM based on particle swarm optimization (PSO) is proposed, which includes model parameter estimation, model optimization on the multi-kernel-function RVM, PSO-based training, and recognition. The results of simulation experim...
Layer-Wise Relevance Propagation for Neural Networks with Local Renormalization Layers
Layer-wise relevance propagation is a framework that decomposes the prediction of a deep neural network computed over a sample, e.g. an image, down to relevance scores for the single input dimensions of the sample, such as the subpixels of an image. While this approach can be applied directly to generalized linear mappings, product-type non-linearities are not covered. This paper proposes ...
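The decomposition for a generalized linear mapping mentioned in this excerpt is commonly realized with the LRP epsilon-rule, R_j = a_j * sum_k (w_jk / (z_k + eps*sign(z_k))) * R_k with z_k = sum_j a_j w_jk. A minimal sketch for one linear layer (variable names are illustrative):

```python
import numpy as np

def lrp_epsilon_linear(a, W, R_out, eps=1e-6):
    """LRP epsilon-rule for a single linear layer.

    a     : lower-layer activations, shape (n_in,)
    W     : weights, shape (n_in, n_out)
    R_out : upper-layer relevance, shape (n_out,)
    Returns the relevance redistributed to the lower layer, shape (n_in,).
    """
    z = a @ W                            # pre-activations z_k = sum_j a_j w_jk
    s = R_out / (z + eps * np.sign(z))   # stabilized relevance ratio
    return a * (W @ s)                   # R_j = a_j * sum_k w_jk * s_k

# Toy check: total relevance is approximately conserved across the layer.
a = np.array([1.0, 2.0])
W = np.array([[0.5, -0.3],
              [0.2,  0.4]])
R_out = np.array([1.0, 0.5])
R_in = lrp_epsilon_linear(a, W, R_out)
print(R_in.sum())  # close to R_out.sum() = 1.5, up to the epsilon term
```

The epsilon term only stabilizes near-zero denominators, so relevance conservation holds up to a small leak.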
Beyond saliency: understanding convolutional neural networks from saliency prediction on layer-wise relevance propagation
Despite the tremendous achievements of deep convolutional neural networks (CNNs) in most computer vision tasks, understanding how they actually work remains a significant challenge. In this paper, we propose a novel two-step visualization method that aims to shed light on how deep CNNs recognize images and the objects therein. We start out with a layer-wise relevance propagation (LRP) step w...
Journal
Journal Title: Lecture Notes in Computer Science
Year: 2021
ISSN: 1611-3349, 0302-9743
DOI: https://doi.org/10.1007/978-3-030-89128-2_47